

Optimization of Smooth Functions with Noisy Observations: Local Minimax Rates

Wang, Yining, Balakrishnan, Sivaraman, Singh, Aarti

Neural Information Processing Systems

We consider the problem of global optimization of an unknown non-convex smooth function with noisy zeroth-order feedback. We propose a local minimax framework to study the fundamental difficulty of optimizing smooth functions with adaptive function evaluations. We show that for functions with fast growth around their global minima, carefully designed optimization algorithms can identify a near-global minimizer with far fewer queries than worst-case global minimax theory predicts. For the special case of strongly convex and smooth functions, our implied convergence rates match those developed for zeroth-order convex optimization problems. On the other hand, we show that in the worst case no algorithm can converge faster than the minimax rate of estimating an unknown function in ℓ∞-norm. Finally, we show that non-adaptive algorithms, although optimal in a global minimax sense, do not attain the optimal local minimax rate.
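For orientation, one standard way to formalize a local minimax rate around a benchmark function f_0 is the following sketch; the notation (algorithm class, neighborhood radius ε) is assumed here for illustration, and the paper's exact definition may differ in its choice of neighborhood:

\[
\mathcal{R}_n(f_0;\varepsilon) \;=\; \inf_{\mathcal{A}} \;\; \sup_{f:\ \|f - f_0\|_\infty \le \varepsilon} \; \mathbb{E}_f\!\left[\, f(\hat{x}_n) - \inf_{x} f(x) \,\right],
\]

where the infimum ranges over algorithms \(\mathcal{A}\) that adaptively issue n noisy zeroth-order queries and output a candidate minimizer \(\hat{x}_n\), and the optimization error is measured against the global minimum of f.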



Reviews: Optimization of Smooth Functions with Noisy Observations: Local Minimax Rates

Neural Information Processing Systems

In the traditional setting of global optimization, the algorithmic goal is to design an adaptive procedure that finds an approximate (in this case, global) optimum of an unknown function f, given access to noisy evaluations at feasible points. This worst-case notion of complexity may be too coarse to capture the difficulty of the problem, so the paper proposes to study local minimax complexity, in which the algorithm may have additional information on how close the objective is to a fixed function f_0. In practice, the algorithm does not need to know f_0; this parameterization instead serves as an instance-dependent notion of complexity. The paper additionally considers how the regularity of the function (measured by its Hölder continuity) affects the complexity, as it is well known that for Lipschitz functions adaptivity does not help. The main contributions of the paper can be summarized as follows: 1.
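For concreteness, the regularity referenced here is Hölder continuity; a standard definition (stated for exponent α ∈ (0, 1], with the domain \(\mathcal{X}\) and constant L assumed here rather than taken from the review) is:

\[
|f(x) - f(y)| \;\le\; L\,\|x - y\|^{\alpha} \quad \text{for all } x, y \in \mathcal{X}.
\]

The case α = 1 recovers the Lipschitz class, for which adaptivity is known not to improve the worst-case rate. As a point of comparison for the paper's lower bound, the classical global minimax rate for estimating such a function in sup-norm from n noisy samples on a d-dimensional domain scales, up to logarithmic factors, as \(n^{-\alpha/(2\alpha + d)}\).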

